
    Effect preservation in transaction processing in rule triggering systems

    Rules provide an expressive means for implementing database behavior: they cope with changes and their ramifications. Rules are commonly used for integrity enforcement, i.e., for repairing database actions so that integrity constraints are kept. Yet rule triggering systems fall short in enforcing effect preservation, i.e., guaranteeing that repairing events do not undo each other, and in particular do not undo the original triggering event. A method for enforcing effect preservation on updates in general rule triggering systems is suggested. The method derives transactions from rules and then splits the work between compile time and run time. At compile time, a data structure is constructed that analyzes the execution sequences of a transaction and computes minimal conditions for effect preservation. The transaction code is augmented with instructions that navigate along the data structure and test the computed minimal conditions. This method produces minimal effect-preserving transactions and, under certain conditions, provides a meaningful improvement over the quadratic overhead of pure run-time procedures. For transactions without loops, the run-time overhead is linear in the size of the transaction; for general transactions, it depends linearly on the length of the execution sequence and the number of loop repetitions. The method is currently being implemented within a traditional database system.
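The quadratic run-time baseline mentioned above is easy to picture: checking every pair of updates in an execution sequence for a mutual undo. A minimal sketch, assuming updates are modeled as (attribute, delta) pairs (a simplification for illustration, not the paper's formalism):

```python
# Hypothetical sketch (not the paper's algorithm): a naive O(n^2) run-time
# check that no later update in an execution sequence undoes an earlier one.
# This is the quadratic baseline the compile-time analysis improves on.

def undoes(later, earlier):
    """A later update undoes an earlier one if it inverts the same attribute."""
    attr_l, delta_l = later
    attr_e, delta_e = earlier
    return attr_l == attr_e and delta_l == -delta_e

def is_effect_preserving(sequence):
    for i, later in enumerate(sequence):
        for earlier in sequence[:i]:
            if undoes(later, earlier):
                return False
    return True

# The triggering event raises 'salary'; a repair that lowers it back
# violates effect preservation.
seq_ok  = [("salary", +100), ("budget", -100)]
seq_bad = [("salary", +100), ("salary", -100)]
```

The compile-time data structure described in the abstract would precompute which pairs can conflict, so the run-time check degenerates to testing only the minimal conditions instead of all pairs.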

    Abstracts from the 8th International Conference on cGMP Generators, Effectors and Therapeutic Implications

    This work was supported by a restricted research grant from Bayer AG.

    Ein Simultan-Ausführungsschema für Datenbank-Cache-Verfahren

    Database caching techniques promise to improve the performance and scalability of client-server database applications. The task of a cache is to accept requests from clients and to compute them locally on behalf of the server. The content of a cache is filled dynamically based on the application's and users' data domain of interest. If data is missing or concurrent access has to be controlled, the computation of the request is completed at the central server. As a result, applications benefit from the quick responses of a cache, and load is taken from the server. The dynamic nature of a cache, the need for transactional consistency, and the complex nature of a request make database caching a challenging field of research. This thesis presents a novel approach to the shared and parallel execution of stored procedure code between a cache and the server. Virtually every commercial database product provides such stored procedures, coded in a complete programming language. Given a request in the form of such a procedure, we introduce the concept of split twin transactions, which logically split the procedure code into two parts, say A and B, such that A is executed at the cache and B at the server in a simultaneous, parallel manner. Furthermore, we analyse the procedure code to detect suitable parts. To the best of our knowledge, this has not yet been addressed by any existing approach. In a detailed case study, we show that our novel scheme improves the performance of existing caching approaches. Furthermore, we demonstrate that different load conditions of the system require different sizes of the parts A and B to attain maximal performance. As a result, we extend database caching by a new dimension of optimization, namely the splitting of the procedure code into A and B.
To solve this problem of dynamically balancing the code execution between cache and server, we define the maximum performance of a database cache over time and propose a stochastic model to capture the average execution time of a procedure. Based on the execution frequencies of primitive database operations, the model allows us to partially predict the response times for different sizes of A and B, hence providing a partial solution to the optimization problem.
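The kind of cost model described above can be sketched roughly as follows. All operation names and cost figures are assumptions for illustration, not the thesis's actual model: expected times for parts A and B are derived from operation frequencies, and since the parts run simultaneously, the response time is dominated by the slower part.

```python
# Illustrative sketch of a frequency-based cost model for splitting a stored
# procedure into part A (cache) and part B (server). Costs are hypothetical.

COST = {  # average ms per primitive operation at each site (assumed figures)
    "select": {"cache": 1.0, "server": 3.0},
    "update": {"cache": 4.0, "server": 2.0},
}

def expected_time(ops, site):
    """ops: {operation: frequency}; expected total time of the part at `site`."""
    return sum(freq * COST[op][site] for op, freq in ops.items())

def split_response_time(part_a, part_b):
    # A and B execute simultaneously, so the response time is the slower part.
    return max(expected_time(part_a, "cache"), expected_time(part_b, "server"))

# Two candidate splits of the same procedure: reads at the cache vs. at the
# server. The model predicts which split responds faster.
fast = split_response_time({"select": 10}, {"update": 5})
slow = split_response_time({"update": 5}, {"select": 10})
```

Under this toy model, the optimizer would simply evaluate `split_response_time` for each feasible split found by the code analysis and pick the minimum, re-evaluating as load conditions shift the per-site costs.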

    Object Identification Quality

    Research and industry have tackled the object identification problem of data integration in many different ways. This paper presents a framework that allows the evaluation of competing approaches. To this end, complexity measures and data characteristics are introduced that reflect the hardness of a given object identification problem. All characteristics can be estimated using simple SQL queries and simple calculations. Following the principle of benchmark definitions, we specify a test framework. It consists of a test database and its characteristics, quality criteria, and a test specification. Adequate measures for the correctness criterion of the benchmark are given. A running example, the Berlin Online Apartment Advertisements database (BOA), illustrates the approach. The BOA database is freely available at www.wiwiss.fu-berlin.de/lenz/boa/.
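As a rough illustration of estimating such characteristics with simple SQL, the sketch below computes a per-attribute selectivity on a toy in-memory table. The schema, data, and the `selectivity` measure are hypothetical stand-ins, not the paper's actual characteristics.

```python
# Hedged sketch: estimating a data characteristic with one simple SQL query.
# Low selectivity (few distinct values) suggests many candidate duplicate
# pairs, i.e., a harder object identification problem.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ads (rooms INTEGER, district TEXT)")
conn.executemany("INSERT INTO ads VALUES (?, ?)",
                 [(2, "Mitte"), (2, "Mitte"), (3, "Pankow"), (2, "Pankow")])

def selectivity(column):
    """Distinct values / total rows for one attribute (illustration only;
    the column name is interpolated, so use trusted identifiers)."""
    distinct, total = conn.execute(
        f"SELECT COUNT(DISTINCT {column}), COUNT(*) FROM ads").fetchone()
    return distinct / total
```

A benchmark harness could run such queries over each attribute of a test database and report the resulting characteristics alongside the quality criteria.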

    Clinical profile and outcome of isolated pulmonary embolism: a systematic review and meta-analysis

    Summary: Background: Isolated pulmonary embolism (PE) appears to be associated with a specific clinical profile and sequelae compared to deep vein thrombosis (DVT)-associated PE. The objective of this study was to identify clinical characteristics that discriminate both phenotypes, and to characterize their differences in clinical outcome. Methods: We performed a systematic review and meta-analysis of studies comparing PE phenotypes. A systematic search of the electronic databases PubMed and CENTRAL was conducted, from inception until January 27, 2023. Exclusion criteria were irrelevant content, inability to retrieve the article, language other than English or German, the article comprising a review or case study/series, and inappropriate study design. Data on risk factors, clinical characteristics and clinical endpoints were pooled using random-effects meta-analyses. Findings: Fifty studies with 435,768 PE patients were included. In low risk of bias studies, 30% [95% CI 19–42%, I2 = 97%] of PE were isolated. The Factor V Leiden [OR: 0.47, 95% CI 0.37–0.58, I2 = 0%] and prothrombin G20210A mutations [OR: 0.55, 95% CI 0.41–0.75, I2 = 0%] were significantly less prevalent among patients with isolated PE. Female sex [OR: 1.30, 95% CI 1.17–1.45, I2 = 79%], recent invasive surgery [OR: 1.31, 95% CI 1.23–1.41, I2 = 65%], a history of myocardial infarction [OR: 2.07, 95% CI 1.85–2.32, I2 = 0%], left-sided heart failure [OR: 1.70, 95% CI 1.37–2.10, I2 = 76%], peripheral artery disease [OR: 1.36, 95% CI 1.31–1.42, I2 = 0%] and diabetes mellitus [OR: 1.23, 95% CI 1.21–1.25, I2 = 0%] were significantly more frequently represented among isolated PE patients. In a synthesis of clinical outcome data, the risk of recurrent VTE in isolated PE was half that of DVT-associated PE [RR: 0.55, 95% CI 0.44–0.69, I2 = 0%], while the risk of arterial thrombosis was nearly 3-fold higher [RR: 2.93, 95% CI 1.43–6.02, I2 = 0%]. 
Interpretation: Our findings suggest that isolated PE appears to be a specific entity that may signal a long-term risk of arterial thrombosis. Randomised controlled trials are necessary to establish whether alternative treatment regimens are beneficial for this patient subgroup. Funding: None
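The pooled odds ratios above rest on inverse-variance weighting of per-study log odds ratios. A minimal sketch of that core step (the numbers are made up; the study used random-effects models, which additionally estimate a between-study variance term):

```python
# Illustrative fixed-effect inverse-variance pooling of odds ratios.
# Study ORs and standard errors below are invented for demonstration only.
import math

def pool_fixed(ors, ses):
    """ors: per-study odds ratios; ses: standard errors of log(OR).
    Each study is weighted by 1/SE^2; pooling happens on the log scale."""
    weights = [1 / se**2 for se in ses]
    pooled_log = sum(w * math.log(o) for w, o in zip(weights, ors)) / sum(weights)
    return math.exp(pooled_log)

# Two hypothetical studies of a protective factor (OR < 1):
pooled = pool_fixed([0.45, 0.50], [0.2, 0.3])
```

The pooled estimate lands between the study ORs, pulled toward the more precise (smaller-SE) study; a random-effects model widens the weights by adding the between-study variance to each SE squared.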

    CD40L contributes to angiotensin II-induced pro-thrombotic state, vascular inflammation, oxidative stress and endothelial dysfunction

    CD40 ligand (CD40L) is involved in the vascular infiltration of immune cells and the pathogenesis of atherosclerosis. Additionally, T cell CD40L release causes platelet, dendritic cell and monocyte activation in thrombosis. However, the role of CD40L in angiotensin II (ATII)-driven vascular dysfunction and hypertension remains incompletely understood. We tested the hypothesis that CD40L contributes to ATII-driven vascular inflammation by promoting platelet-leukocyte activation and vascular infiltration of immune cells, and by amplifying oxidative stress. C57BL/6 and CD40L(-/-) mice were infused with ATII (1 mg/kg/day for 7 days) using osmotic minipumps. Vascular function was recorded by isometric tension studies, and reactive oxygen species (ROS) were monitored in blood and heart by optical methods. Western blot, immunohistochemistry, FACS analysis and real-time RT-PCR were used to analyze immune cell distribution, pro-inflammatory cytokines, NADPH oxidase subunits, T cell transcription factors and other genes of interest. ATII-treated CD40L(-/-) mice showed improved endothelial function, suppression of blood platelet-monocyte interaction (FACS), platelet thrombin generation (calibrated automated thrombography) and coagulation (bleeding time), as well as decreased oxidative stress in the aorta, heart and blood compared to wild-type mice. Moreover, ATII-treated CD40L(-/-) mice displayed decreased levels of TH1 cytokines released by splenic CD4(+) T cells (ELISA) and lower expression levels of NOX-2, T-bet and P-selectin, as well as diminished immune cell infiltration in aortic tissue compared to controls. Our results demonstrate that many ATII-induced effects on vascular dysfunction, such as vascular inflammation, oxidative stress and a pro-thrombotic state, are mediated at least in part via CD40L.

    Subtype-specific plasma signatures of platelet-related protein releasate in acute pulmonary embolism

    INTRODUCTION: There is evidence that plasma protein profiles differ in the two subtypes of pulmonary embolism (PE), isolated PE (iPE) and deep vein thrombosis (DVT)-associated PE (DVT-PE), in the acute phase. The aim of this study was to determine specific plasma signatures for proteins related to platelets in acute iPE and DVT-PE compared to isolated DVT (iDVT). METHODS: Within the Genotyping and Molecular Phenotyping of Venous ThromboEmbolism (GMP-VTE) Project, a multicenter prospective cohort study of 693 confirmed VTE cases, a highly sensitive targeted proteomics approach based on a dual-antibody proximity extension assay was applied. LASSO-regularized logistic regression analysis selected 33 and 30 of 135 platelet-related candidate proteins in iPE and DVT-PE vs. iDVT, respectively. RESULTS: All regulated proteins were well associated with six prominently released platelet proteins, and the majority showed specificity for iPE and DVT-PE compared to iDVT. While iPE-specific proteins were assigned to be predominantly released via shedding mechanisms and extracellular vesicles, granule secretion was identified as the major release mechanism assigned to DVT-associated PE-specific proteins. Network analysis demonstrated three interconnected clusters of specifically regulated proteins in iPE, linked to immunoreceptor signaling, pathogen clearance and chemotaxis, whereas DVT-associated PE showed one cluster linked to tissue remodeling and leukocyte trafficking. Machine learning-based analysis revealed specific plasma signatures and differential release mechanisms of proteins related to platelets in acute iPE and DVT-associated PE. CONCLUSION: These data suggest that the platelet protein releasate contributes to the differential regulation of plasma proteins in acute PE compared to iDVT, which may be associated with different platelet activation patterns.